VC-dimension and Erdős-Pósa property
Authors
Abstract
Let G = (V,E) be a graph. A k-neighborhood in G is a set of vertices consisting of all the vertices at distance at most k from some vertex of G. The hypergraph on vertex set V whose edge set consists of all the k-neighborhoods of G, for all k, is the neighborhood hypergraph of G. Our goal in this paper is to investigate the complexity of a graph in terms of its neighborhoods. Precisely, we define the distance VC-dimension of a graph G as the maximum, taken over all induced subgraphs G′ of G, of the VC-dimension of the neighborhood hypergraph of G′. For a class of graphs, having bounded distance VC-dimension generalizes both minor-closed classes and classes of graphs with bounded clique-width. Our motivation is a result of Chepoi, Estellon and Vaxès [5] asserting that every planar graph of diameter 2ℓ can be covered by a bounded number of balls of radius ℓ. In fact, they obtained the existence of a function f such that every set F of balls of radius ℓ in a planar graph admits a hitting set of size f(ν), where ν is the maximum number of pairwise disjoint elements of F. Our goal is to generalize the proof of [5] under the sole assumption of bounded distance VC-dimension. In other words, the set of balls of fixed radius in a graph with bounded distance VC-dimension has the Erdős-Pósa property.
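To make the definitions above concrete, here is a minimal Python sketch (not from the paper): it builds the neighborhood hypergraph of a small graph as the family of balls of every radius and checks shattering by brute force. The function names (ball, neighborhood_hypergraph, vc_dimension) and the 5-cycle example are illustrative assumptions; the distance VC-dimension would additionally take the maximum over all induced subgraphs.

from collections import deque
from itertools import combinations

def ball(adj, v, k):
    # Vertices at distance at most k from v, by breadth-first search.
    dist = {v: 0}
    queue = deque([v])
    while queue:
        u = queue.popleft()
        if dist[u] == k:
            continue
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return frozenset(dist)

def neighborhood_hypergraph(adj):
    # All k-neighborhoods of the graph; radii 0..n-1 suffice, since any
    # larger radius yields the whole vertex set again.
    n = len(adj)
    return {ball(adj, v, k) for v in adj for k in range(n)}

def vc_dimension(edges, vertices):
    # Largest size of a vertex set shattered by the hyperedges (brute force,
    # for tiny examples only). A set S is shattered when each of its 2^|S|
    # subsets arises as the intersection of S with some hyperedge.
    best = 0
    for size in range(1, len(vertices) + 1):
        if any(len({e & frozenset(S) for e in edges}) == 2 ** size
               for S in combinations(vertices, size)):
            best = size
        else:
            break
    return best

# Illustrative example: the 5-cycle.
adj = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
print(vc_dimension(neighborhood_hypergraph(adj), list(adj)))  # prints 3

The brute-force shattering check is exponential and only meant to illustrate the definition; the paper's results concern graph classes where this quantity is uniformly bounded.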
Similar papers
Erdős-Hajnal Conjecture for Graphs with Bounded VC-Dimension
The Vapnik-Chervonenkis dimension (in short, VC-dimension) of a graph is defined as the VC-dimension of the set system induced by the neighborhoods of its vertices. We show that every n-vertex graph with bounded VC-dimension contains a clique or an independent set of size at least e^{(log n)^{1-o(1)}}. The dependence on the VC-dimension is hidden in the o(1) term. This improves the general lower bound, e^{c√(log n)}, ...
Tight Bounds for the VC-Dimension of Piecewise Polynomial Networks
O(ws(s log d + log(dqh/s))) and O(ws((h/s) log q) + log(dqh/s)) are upper bounds for the VC-dimension of a set of neural networks of units with piecewise polynomial activation functions, where s is the depth of the network, h is the number of hidden units, w is the number of adjustable parameters, q is the maximum of the number of polynomial segments of the activation function, and d is the max...
Tight Bounds for the VC-Dimension of Feedforward and Recurrent Networks of Piecewise Polynomial Activation Function Units
We consider the VC-dimension of the class of neural networks of depth s with w adjustable parameters and h hidden units whose activation functions are piecewise polynomials with at most q segments, each of degree at most d. When d ≥ 2 and q ≥ 2, the VC-dimension is O(ws(s log d + log(qh/s))), O(ws((h/s) log q) + log d), and Ω(ws log(dqh/s)). When d ≥ 2 and q = 1, it is Θ(ws log d). When d = 1 and q ≥ 2, it is Θ(ws log(qh/s))...
Efficient Arithmetic Regularity and Removal Lemmas for Induced Bipartite Patterns
Let G be an abelian group of bounded exponent and A ⊂ G. We show that if the collection of translates of A has VC dimension at most d, then for every ε > 0 there is a subgroup H of G of index at most ε^{-d-o(1)} such that one can add or delete at most ε|G| elements to A to make it a union of H-cosets. We also establish a removal lemma with polynomial bounds, with applications to property testing, for ...
The VC Dimension for Mixtures of Binary Classifiers
The mixtures-of-experts (ME) methodology provides a tool for classification when experts consisting of logistic regression models or Bernoulli models are mixed according to a set of local weights. We show that the Vapnik-Chervonenkis (VC) dimension of the mixtures-of-experts architecture is bounded below by the number of experts m, and is bounded above by O(m^4 s^2), where s is the dimension of the input. ...